Design knowledge capture for a corporate memory facility
Currently, much of the information regarding decision alternatives and trade-offs made in the course of a major program development effort is not represented or retained in a way that permits computer-based reasoning over the life cycle of the program. The loss of this information causes problems in tracing design alternatives to requirements, in assessing the impact of changes in requirements, and in configuration management. To address these problems, the problem of building an intelligent, active corporate memory facility was studied, one which would provide for the capture of the requirements and standards of a program, analyze the design alternatives and trade-offs made over the program's lifetime, and examine relationships between requirements and design trade-offs. Early phases of the work have concentrated on design knowledge capture for the Space Station Freedom. Tools that help automate and document engineering trade studies are demonstrated and extended, and another tool is being developed to help designers interactively explore design alternatives and constraints.
Known and unknown requirements in healthcare
We report experience in requirements elicitation of domain knowledge from experts in clinical and cognitive neurosciences. The elicitation target was a causal model for early signs of dementia indicated by changes in user behaviour and errors apparent in logs of computer activity. A Delphi-style process consisting of workshops with experts followed by a questionnaire was adopted. The paper describes how the elicitation process had to be adapted to deal with problems encountered in terminology and limited consensus among the experts. In spite of the difficulties encountered, a partial causal model of user behavioural pathologies and errors was elicited. This informed requirements for configuring data- and text-mining tools to search for the specific data patterns. Lessons learned for elicitation from experts are presented, and the implications for requirements are discussed as "unknown unknowns", as well as configuration requirements for directing data- and text-mining tools towards refining awareness requirements in healthcare applications.
Cleaner burning aviation fuels can reduce contrail cloudiness
Contrail cirrus account for the major share of aviation's climate impact. Yet the links between jet fuel composition, contrail microphysics and climate impact remain unresolved. Here we present unique observations from two DLR-NASA aircraft campaigns that measured exhaust and contrail characteristics of an Airbus A320 burning either standard jet fuels or low-aromatic sustainable aviation fuel blends. Our results show that soot particles can regulate the number of contrail cirrus ice crystals at current emission levels. We provide experimental evidence that burning low-aromatic sustainable aviation fuel can result in a 50 to 70% reduction in soot and ice number concentrations and an increase in ice crystal size. Reduced contrail ice numbers cause less energy deposition in the atmosphere and less warming. Meaningful reductions in aviation's climate impact could therefore be obtained from the widespread adoption of low-aromatic fuels, and from regulations to lower the maximum aromatic fuel content.
Quantity is Nothing without Quality: Automated QA/QC for Streaming Environmental Sensor Data
Sensor networks are revolutionizing environmental monitoring by producing massive quantities of data that are being made publicly available in near real time. These data streams pose a challenge for ecologists because traditional approaches to quality assurance and quality control are no longer practical when confronted with the size of these data sets and the demands of real-time processing. Automated methods for rapidly identifying and (ideally) correcting problematic data are essential. However, advances in sensor hardware have outpaced those in software, creating a need for tools to implement automated quality assurance and quality control procedures, produce graphical and statistical summaries for review, and track the provenance of the data. Use of automated tools would enhance data integrity and reliability and would reduce delays in releasing data products. Development of community-wide standards for quality assurance and quality control would instill confidence in sensor data and would improve interoperability across environmental sensor networks.
Keywords: informatics, instrumentation, environmental science, computers in biology
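The kind of automated screening the abstract calls for can be illustrated with a minimal sketch (not from the paper; the function name, flag labels, and thresholds are illustrative assumptions): a fixed-range test and a spike (step-change) test that flag suspect readings rather than silently discarding them, so the raw data and its provenance are preserved for later review.

```python
def qc_flags(values, valid_min, valid_max, max_step):
    """Flag each reading in a stream as 'ok', 'range', or 'spike'.

    Illustrative sketch of two common automated QA/QC tests:
    - range test: reading falls outside physically plausible bounds
    - spike test: reading jumps too far from the last accepted value
    Suspect values are flagged, not removed, preserving the raw record.
    """
    flags = []
    last_good = None  # last reading that passed both tests
    for v in values:
        if not (valid_min <= v <= valid_max):
            flags.append("range")
        elif last_good is not None and abs(v - last_good) > max_step:
            flags.append("spike")
        else:
            flags.append("ok")
            last_good = v
    return flags


# Example: air temperature (deg C) with one impossible value and one spike.
temps = [12.1, 12.3, 45.0, 12.4, 18.9, 12.5]
print(qc_flags(temps, valid_min=-40, valid_max=40, max_step=3.0))
# -> ['ok', 'ok', 'range', 'ok', 'spike', 'ok']
```

Comparing each reading against the last *accepted* value, rather than the immediately preceding one, keeps a single bad reading from cascading into a run of false spike flags.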